
    Overcoming Restraint: Composing Verification of Foreign Functions with Cogent

    Cogent is a restricted functional language designed to reduce the cost of developing verified systems code. Because of its sometimes-onerous restrictions, such as the lack of support for recursion and its strict uniqueness type system, Cogent provides an escape hatch in the form of a foreign function interface (FFI) to C code. This poses a problem when verifying Cogent programs, as imported C components do not enjoy the same level of static guarantees that Cogent does. Previous verifications of file systems implemented in Cogent merely assumed that their C components were correct and that they preserved the invariants of Cogent's type system. In this paper, we instead prove such obligations. We demonstrate how they compose smoothly with existing Cogent theorems, yielding a correctness theorem for the overall Cogent-C system. The Cogent FFI constraints ensure that key invariants of Cogent's type system are maintained even when calling C code. We verify reusable higher-order and polymorphic functions, including a generic loop combinator and array iterators, and demonstrate their application to several examples, including binary search and the BilbyFs file system. We demonstrate the feasibility of verifying mixed Cogent-C systems, and provide some insight into the verification of software comprising code in multiple languages with differing levels of static guarantees.
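
    The following Rust sketch is only meant to illustrate the flavor of the reusable combinators described above: a fuel-bounded loop combinator and a binary search expressed through it. The names, types, and fuel bound are illustrative assumptions, not the paper's actual Cogent/C interface.

    ```rust
    /// Fuel-bounded loop combinator, a sketch in the spirit of the generic loop
    /// combinator mentioned above (illustrative only; not the paper's API).
    fn repeat<A>(fuel: usize, mut acc: A, stop: impl Fn(&A) -> bool, step: impl Fn(A) -> A) -> A {
        for _ in 0..fuel {
            if stop(&acc) {
                break;
            }
            acc = step(acc);
        }
        acc
    }

    /// Binary search over a sorted slice, written without recursion by going
    /// through the combinator, mirroring the style imposed by Cogent's
    /// restriction to non-recursive code.
    fn binary_search(xs: &[i32], key: i32) -> Option<usize> {
        let n = xs.len();
        // State: (lo, hi, result). The interval halves every step, so a fuel
        // of n + 1 iterations is always sufficient.
        let (_, _, result) = repeat(
            n + 1,
            (0usize, n, None::<usize>),
            |st| st.2.is_some() || st.0 >= st.1,
            |(lo, hi, _)| {
                let mid = lo + (hi - lo) / 2;
                if xs[mid] == key {
                    (lo, hi, Some(mid))
                } else if xs[mid] < key {
                    (mid + 1, hi, None)
                } else {
                    (lo, mid, None)
                }
            },
        );
        result
    }
    ```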

    Verification of program computations

    Formal verification of complex algorithms is challenging. Verifying their implementations in reasonable time is infeasible using current verification tools and usually involves intricate mathematical theorems. Certifying algorithms compute, in addition to each output, a witness certifying that the output is correct. A checker for such a witness is usually much simpler than the original algorithm, yet it is all the user has to trust. The verification of checkers is feasible with current tools and leads to computations that can be completely trusted. We describe a framework to seamlessly verify certifying computations. We demonstrate the effectiveness of our approach by presenting the verification of typical examples from the widely used, industrial-strength algorithmic library LEDA. We present and compare two alternative methods for verifying the C implementation of the checkers. Moreover, we present work that was done during an internship at NICTA, Australia's Information and Communications Technology Research Centre of Excellence. This work contributes to building a feasible framework for verifying efficient file systems code. As opposed to the algorithmic problems we address in this thesis, file systems code is mostly straightforward and hence a good candidate for automation.
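
    To make the certifying-algorithms idea concrete, here is a minimal Rust sketch (assuming a simple edge-list graph representation) of the two checkers a certifying bipartiteness test would need: one for a 2-coloring witness and one for an odd-cycle witness. Only these small checkers have to be trusted. This illustrates the general technique, not the thesis' actual LEDA checkers, which are implemented in C.

    ```rust
    /// Checker for a "yes" witness: a 2-coloring is valid iff every edge joins
    /// two differently colored vertices. (Illustrative sketch, not LEDA code.)
    fn check_two_coloring(edges: &[(usize, usize)], color: &[bool]) -> bool {
        edges.iter().all(|&(u, v)| color[u] != color[v])
    }

    /// Checker for a "no" witness: an odd cycle is valid iff it is non-empty,
    /// has odd length, and consecutive vertices (cyclically) are joined by an
    /// edge of the graph.
    fn check_odd_cycle(edges: &[(usize, usize)], cycle: &[usize]) -> bool {
        let is_edge = |a: usize, b: usize| {
            edges.iter().any(|&(u, v)| (u, v) == (a, b) || (v, u) == (a, b))
        };
        !cycle.is_empty()
            && cycle.len() % 2 == 1
            && (0..cycle.len()).all(|i| is_edge(cycle[i], cycle[(i + 1) % cycle.len()]))
    }
    ```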

    Prediction and Sampling with Local Graph Transforms for Quasi-Lossless Light Field Compression


    Rate-Distortion Optimized Super-Ray Merging for Light Field Compression

    In this paper, we focus on the problem of compressing dense light fields, which represent very large volumes of highly redundant data. In our scheme, view synthesis based on convolutional neural networks (CNN) is used as a first prediction step to exploit inter-view correlation. Super-rays are then constructed to capture the inter-view and spatial redundancy remaining in the prediction residues. To ensure that the super-ray segmentation is highly correlated with the residues to be encoded, the super-rays are computed on synthesized residues (the difference between the four transmitted corner views and their corresponding synthesized views) instead of on the synthesized views. Neighboring super-rays are merged into a larger super-ray according to a rate-distortion cost. A 4D shape-adaptive discrete cosine transform (SA-DCT) is applied per super-ray to the prediction residues in both the spatial and angular dimensions. A traditional coding scheme consisting of quantization and entropy coding is then used to encode the transformed coefficients. Experimental results show that the proposed coding scheme outperforms HEVC-based schemes at low bitrates.
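
    The merging step can be pictured with a small Rust sketch of a Lagrangian rate-distortion test; the data layout, cost model, and function names below are assumptions made for illustration, not the paper's exact formulation.

    ```rust
    /// Per-super-ray statistics gathered after transforming and coding its
    /// prediction residues (assumed layout, for illustration only).
    struct SuperRayStats {
        distortion: f64, // e.g. sum of squared reconstruction errors
        rate: f64,       // estimated bits for coefficients and segmentation signalling
    }

    /// Lagrangian cost J = D + lambda * R.
    fn cost(s: &SuperRayStats, lambda: f64) -> f64 {
        s.distortion + lambda * s.rate
    }

    /// Two neighboring super-rays are merged when coding them jointly is no
    /// more expensive, in the rate-distortion sense, than coding them apart.
    fn should_merge(a: &SuperRayStats, b: &SuperRayStats, merged: &SuperRayStats, lambda: f64) -> bool {
        cost(merged, lambda) <= cost(a, lambda) + cost(b, lambda)
    }
    ```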

    Cogent: uniqueness types and certifying compilation

    This paper presents a framework aimed at significantly reducing the cost of proving functional correctness for low-level operating systems components. The framework is designed around a new functional programming language, Cogent. A central aspect of the language is its uniqueness type system, which eliminates the need for a trusted runtime or garbage collector while still guaranteeing memory safety, a crucial property for safety and security. Moreover, it allows us to assign two semantics to the language: the first is imperative, suitable for efficient C code generation, and the second is purely functional, providing a user-friendly interface for equational reasoning and verification of higher-level correctness properties. The refinement theorem connecting the two semantics allows the compiler to produce a proof, via translation validation, certifying the correctness of the generated C code with respect to the semantics of the Cogent source program. We have demonstrated the effectiveness of our framework for implementation and verification through two file system implementations.
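
    Rust's move semantics offer a rough analogy, and only an analogy, for how a uniqueness type supports both readings of a program: because a unique value has no aliases, the compiler may mutate it in place for the imperative C semantics while the programmer reasons about it as a pure function from old value to new value. The sketch below is Rust, not Cogent, and the names are invented for illustration.

    ```rust
    /// A heap object that, like a Cogent linear value, has exactly one owner.
    struct Buffer {
        bytes: Vec<u8>,
    }

    /// Takes ownership of the buffer and returns it, mirroring a pure function
    /// of type `Buffer -> Buffer`. Since no alias can exist, zeroing the bytes
    /// in place is observationally the same as returning a freshly built
    /// buffer, which is the intuition behind Cogent's dual semantics.
    /// (Analogy only; this is not Cogent code.)
    fn zero_fill(mut buf: Buffer) -> Buffer {
        for b in buf.bytes.iter_mut() {
            *b = 0;
        }
        buf
    }
    ```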